
Published in Vol 28 (2026)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/81059.
Child Vaccination Status and Behavioral and Social Drivers of Vaccination Among Their Caregivers in the Philippines: Cross-Sectional Survey Study Comparison of Household, Mobile, and Online Modes


1Global Immunization Division, Centers for Disease Control and Prevention, 1600 Clifton Rd NE, Atlanta, GA, United States

2IDInsight, Makati, Philippines

3Centers for Disease Control and Prevention, Global Health Center, Atlanta, GA, United States

4The Task Force for Global Health, Atlanta, GA, United States

5Republic of the Philippines Department of Health, Manila, Philippines

Corresponding Author:

Kimberly E Bonner, MPA, PhD

Abstract

Background: The World Health Organization recommends that countries routinely collect data on the behavioral and social drivers (BeSD) of vaccination to inform public health interventions that increase vaccine uptake. There is a need to identify data collection methods that can rapidly and inexpensively collect representative data, particularly in low- and middle-income countries.

Objective: This study aimed to understand the BeSD of vaccination in the Philippines and assess the trade-offs between survey methods. We compared responses to household, mobile, and online surveys in terms of demographics, vaccination status, responses to BeSD questions, and cost.

Methods: We conducted concurrent household, mobile (SMS text messaging and interactive voice response), and online surveys among caregivers of children 2 years of age and below in Regions V and XII of the Philippines, with sampling differing by survey method. We assessed, for each survey method, (1) respondent demographics (sex, age, region, and socioeconomic status) and (2) the weighted proportion of responses from caregivers of children who received at least one dose of diphtheria-pertussis-tetanus (DPT)–containing vaccine. We estimated the weighted proportion of each BeSD survey response option and calculated the financial cost (monetary outlays) per survey response from an implementer’s perspective by summing the costs incurred in each survey method and dividing by the number of responses received.

Results: We surveyed a total of 1201 household respondents, 2153 mobile respondents, and 398 online respondents from January to March 2025. We found that online and mobile survey respondents were more likely to be male and have completed high school than household survey respondents. The weighted proportion of respondents indicating that their child had received at least one dose of DPT vaccine was 91.8% (n=1090; 95% CI 90%‐93.3%) for the household survey, 90.3% (n=1853) for the mobile survey, and 85% (n=346) for the online survey. With regard to vaccine demand, more than 85% of respondents in each survey method indicated that vaccines are very important, very safe, supported by family, and that they knew where to bring a child for vaccination. More than 30% of mobile and online survey respondents indicated that it was not easy to pay for vaccination. The financial cost to conduct the survey per survey response was US $2.61 for the online survey, US $6.93 for the mobile survey, and US $29.38 for the household survey.

Conclusions: In the Philippines, household, mobile, and online survey methods reached caregivers of children who were unvaccinated against DPT, and these proportions were similar across survey methods. BeSD responses indicated high vaccine demand and challenges in caregivers’ cost to access vaccination. Determining the most appropriate survey method depends on trade-offs between representativeness and costs. However, areas with strong connectivity and high mobile device ownership can consider mobile and online methods as a lower-cost alternative to rapidly collect BeSD data.

J Med Internet Res 2026;28:e81059

doi:10.2196/81059


Introduction

In low- and middle-income countries (LMICs), there is a pressing need for methods of collecting health data beyond household surveys that can inform public health practice [1], particularly for measuring drivers of vaccine demand [2,3].

While household surveys are considered a gold standard for population-representative data [4], there are drawbacks to this method, including declining participation in urban areas, household survey costs, exclusion of nonstationary households [5], and infrequent periodicity [6]. Mobile phone surveys and online surveys offer promising alternatives to household surveys, particularly in areas with high mobile phone ownership rates and network penetration [7]. One key concern with both mobile and online surveys is that the population responding might not be representative of the overall population of interest [8,9], and the responses to survey questions might differ by survey modality [10]. While remote data collection via self-administered SMS text messaging surveys has been compared with other methods in LMICs, these assessments focused on nutrition indicators [10]. While mobile methods for social mobilization and vaccination reminders have been extensively employed in vaccination activities [11], few studies have examined bidirectional messaging or data collection.

The World Health Organization recommends that countries routinely monitor the behavioral and social drivers (BeSD) of vaccination at the community level using a series of validated questions derived from the BeSD framework [12]. An understanding of the drivers of vaccination in a population enables public health programs to develop tailored interventions that address specific population needs and concerns regarding vaccination. Of particular importance is understanding the needs of unvaccinated (zero-dose) populations, defined as children who have received no doses of diphtheria-pertussis-tetanus (DPT)–containing vaccine. There is a need to identify other survey methods that can rapidly collect BeSD data at scale [13]. In the context of BeSD surveys, there is a need to understand how the proportion of respondents with unvaccinated and under-vaccinated children differs by survey method, and what each method costs.

The Philippines has high mobile phone penetration, widespread internet coverage, a large population of zero-dose children who have not received recommended vaccines [14], and documented barriers to vaccine demand [15]. Mobile phone ownership was 122 per 100 people [16], with an estimated 92% of households possessing at least 1 mobile phone in 2022 [17]. Cellular network coverage was 99%, and access to third-generation (3G) networks, which enable internet connectivity, was 96% in 2024 [18]. In addition, the Philippines has a longstanding community health worker system of Barangay Health Workers (BHWs), based in the smallest administrative unit (barangay). BHWs support primary care service delivery by maintaining contact with a designated list of families in their catchment area [19]. Self-administered mobile surveys distributed by BHWs offer an opportunity to leverage the potential speed and scalability of mobile data methods while garnering an improved response rate over random digit dialing through the use of trusted BHW distribution networks. In addition, the Department of Health has established an online survey platform that can be readily accessed through a web search, Facebook, or Viber.

The overall objective of this study is to compare respondent characteristics and BeSD survey responses using 3 different data collection methods: household surveys, self-administered mobile surveys distributed via SMS text messaging by BHWs, and online surveys. Specific aims of this study are to (1) describe respondent demographics and vaccination status by data collection method, (2) compare how the distribution of responses to BeSD questions varies by survey method, and (3) describe the process of implementing the 3 survey methods, including financial costs from an implementer’s perspective and a data quality assessment of response accuracy in the mobile survey.

Methods

Recruitment

Participants for each survey method were recruited from 2 regions in the Philippines: Region V (Bicol) and Region XII (SOCCSKSARGEN). These regions were selected due to mobile penetration above 90% [17], a large population of children who received zero doses of DPT or had an incomplete vaccination series (30.1% unvaccinated in Region XII [17]; 64.7% under-vaccinated in Region V [20]), and security conditions that permit in-person data collection. Participants for each survey method were eligible if they were 18 years of age or older, were caregivers of a child 2 years old or younger, and were residents of Region V or XII during the survey period.

Sampling differed by survey method (Table 1). In the household survey, a 2-stage cluster sampling approach was used, drawing from the barangay, the smallest administrative units in each region. In first-stage cluster sampling, 91 barangays in Region V and 63 barangays in Region XII were selected using probability proportional to size, drawn from 2020 census data. The Philippines health system engages BHWs to maintain a comprehensive list of households in their catchment area to ensure that these households receive relevant health information and services. The second stage drew a simple random sample of 8 households with a child under 2 years of age from the BHW listing from each barangay, for a total target of 1200 households. The estimated sample size of 1200 for the household survey was calculated to ensure a margin of error on the outcome of interest of 5 percentage points or fewer, assuming 150 clusters, 8 households per cluster, an intraclass correlation of 0.3 on DPT vaccination status, and a control group mean of 0.5 (SD 0.5), based on the Demographic Health Survey estimate of 58% of children in these regions reported as fully immunized. Trained data collectors from the study team approached each of the randomly selected households in person and conducted an informed consent process with those who were eligible. If no eligible person was available at the household, data collectors visited each household a second time, and, if no eligible person was available again, they then randomly selected another household in the barangay as a replacement. Following informed consent, data collectors verbally administered a face-to-face 65-question survey in English, Tagalog, or an applicable regional language (including Cebuano/Bisaya, Hiligaynon, or Bikol), according to the preference of the participant. When available and with caregiver consent, data collectors verified the reported vaccination history by directly examining the child’s vaccine card. 
Deidentified data were recorded on a password-protected device, and survey responses were uploaded to a secure web server in accordance with the 2012 Data Privacy Act [21]. Participants with unvaccinated or incompletely vaccinated children were counseled to seek vaccination services from the nearest vaccination site.
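
As a check on the stated sample size, the design effect and resulting margin of error can be reproduced from the assumptions given above (150 clusters, 8 households per cluster, intraclass correlation 0.3, proportion 0.5); the sketch below is illustrative only:

```python
import math

# Inputs taken from the text: 150 clusters of 8 households each,
# intraclass correlation (ICC) of 0.3, outcome proportion p = 0.5.
clusters, per_cluster, icc, p = 150, 8, 0.3, 0.5
n = clusters * per_cluster                 # 1200 households

# Design effect for equal-size clusters: DEFF = 1 + (m - 1) * ICC
deff = 1 + (per_cluster - 1) * icc         # 3.1
n_effective = n / deff                     # about 387 effective observations

# 95% margin of error on a proportion, using the effective sample size
moe = 1.96 * math.sqrt(p * (1 - p) / n_effective)
print(f"DEFF = {deff:.1f}, effective n = {n_effective:.0f}, MOE = {moe:.3f}")
# → DEFF = 3.1, effective n = 387, MOE = 0.050
```

With these inputs, the margin of error works out to about 5 percentage points, matching the target stated above.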

Table 1. Summary of sampling, survey administration, survey questions, and weights by survey method: Philippines Regions V and XII, January-March 2025.
Survey method | Sampling | Administration | Number of questions | Analysis
Household | Two-stage cluster sampling. Stage 1: barangays selected using probability proportional to size. Stage 2: simple random sample of 8 households with a child under 2 y of age from the Barangay Health Worker (BHW) household listing. | In-person, interviewer-administered on a password-protected device. | 65 | Base weights for the probability of selecting each barangay and household based on probability proportional to size; poststratification weights to match the marginal totals for strata defined by the combination of educational attainment of the household head, main water source, and region.
Mobile | Two-stage cluster sampling. Stage 1: barangays selected using probability proportional to size, drawn from 2020 census data. Stage 2: simple random sample of 2 BHWs per barangay, who distributed survey invitation links to households listed in their catchment areas. | SMS text messaging invitation link sent by BHWs to eligible households in their catchment areas; self-administered via SMS text messaging or interactive voice response, as selected by the participant. | 15 | Poststratification weights to match the marginal totals for strata defined by the combination of educational attainment of the household head, main water source, and region.
Online | Nonprobabilistic respondent-driven internet recruitment sampling; opt-in via the Knowledge Informs Responsible Action chatbot. | Self-administered online survey. | 15 | Poststratification weights to match the marginal totals for strata defined by the combination of educational attainment of the household head, main water source, and region.

In the mobile phone survey, barangays, the smallest administrative units in each region, were selected using probability proportional to size sampling in the same municipalities, but not the same barangays, as the in-person survey sites. A simple random sample of 2 BHWs per barangay was selected to distribute the mobile survey via SMS text messaging invitation link, using the BHW lists, which contained contact information, including mobile phone numbers, for each of the households with children 2 years of age and below. These BHWs were briefed on the purpose of the survey and asked to share an opt-in survey keyword via SMS text messaging with all eligible individuals using their catchment area household listing. To facilitate this outreach, each BHW received a one-time communication allowance of Php 300 (~US $5.40). As a quality check, BHWs were asked to report the total number of eligible individuals in their catchment area and the number of eligible individuals who had been sent the survey invitation message. The survey was hosted on the EngageSPARK platform, a self-service platform that sends interactive automated phone calls (interactive voice response [IVR] surveys) and 2-way SMS text messaging campaigns and collates survey responses. Recipients of the SMS text messaging survey invitation could opt into the survey by entering the survey shortcode included in the survey invitation to access the EngageSPARK platform. Following an initial screening questionnaire to confirm age, location, and whether there were children 2 years of age or younger in the household, participants were asked to proceed within the survey if they consented to participate. Next, participants could select to proceed in English or Tagalog and via SMS text messaging or by IVR.

In the online survey, we employed nonprobabilistic respondent-driven internet recruitment sampling, where regionally geo-targeted Facebook advertisements encouraged eligible respondents to opt into the survey. Respondents could opt into participating in the online survey through the pre-existing Knowledge Informs Responsible Action chatbot, known locally as Katuwang na Impormasyon para sa Responsableng Aksyon, hosted by the Philippines Department of Health (DOH) [22] and maintained by AI4GOV. The Knowledge Informs Responsible Action chatbot service offers information, collects survey data, and can be accessed through the DOH webpage, DOH Facebook page, and via Viber.

Measures

Across the 3 surveys, we used consistent measures to ascertain eligibility, respondent demographics, child vaccination status, and responses to the BeSD questions.

The interviewer-administered household survey included 16 questions on caregiver demographics, 28 on child demographics and vaccination status, and 21 BeSD questions (Table S1 in Multimedia Appendix 1). Each survey method used a consistent core set of questions; the household survey included additional questions at the request of local authorities.

The self-administered mobile survey consisted of 15 questions, including 4 demographic questions, 7 core BeSD of vaccination questions, and 4 questions on demographics and vaccination status of the youngest child under 2 years of age in the household (Table S1 in Multimedia Appendix 1). We opted to assess a subset of the household survey questions, in alignment with the number and estimated completion time of other mobile surveys [7].

The self-administered online survey began with a welcome, eligibility confirmation, and informed consent process. Participants completed the 15-question survey in English, Tagalog, or Cebuano/Bisaya (Table S1 in Multimedia Appendix 1). The survey consisted of 4 demographic questions, 7 core BeSD questions, and 4 questions on demographics and vaccination status of the youngest child under 2 years of age in the household.

All responses were self-reported, although DPT vaccination status was also recorded by card verification when available. Region was classified as Region V, Region XII, or missing. The age of the respondent at the last birthday was recorded in years. The sex of the respondent was classified as male, female, or missing.

To assess proxies for socioeconomic status, we assessed educational attainment and main water source. Education was classified as did not graduate from elementary, elementary graduate, high school graduate, college graduate, postgraduate studies, or missing, and then collapsed to a binary (completed vs did not complete high school) for the analysis. Water source was classified as piped water, dug well, tube well or borehole, water from a spring, water refilling station, others, or missing.

We classified children under 2 years of age who received zero doses of DPT-containing vaccine as zero-dose, consistent with the literature [23]. Vaccination status was classified as received at least 1 dose of DPT or did not receive DPT. This variable was assessed by vaccine card verification when available in the household survey, and by self-report in all other cases (mobile and internet surveys, plus household survey respondents without available vaccine cards). Self-reported DPT vaccination status was used in the analysis, and a sensitivity analysis compared differences in DPT vaccination status between respondents who provided vaccination card verification and those who provided only self-report.

The BeSD questions were drawn from 7 of the WHO BeSD validated core questions [12] (Table S1 in Multimedia Appendix 1). We assessed confidence in vaccine importance with the question, “How important do you think vaccines are for your child’s health?” with response options of Very important, Moderately important, A little important, and Not at all important. Similarly, confidence in vaccine safety was assessed with the question, “How safe do you think vaccines are for your child?” with response options of Very safe, Moderately safe, A little safe, and Not at all safe. We assessed social norms with the question, “If it was time for your child to get vaccinated, would the mother need permission from any household member to take your child to the clinic?” with response options of Yes or No. In the domain of practical issues, 3 questions assessed access issues. Knowledge of where to access the vaccine was assessed with the question “Do you know where to go to get your child vaccinated?” with response options of Yes and No. Difficulty of access was assessed with the question, “How easy is it to get vaccination services for your child?” with ordinal response options of Not at all easy, A little easy, Moderately easy, and Very easy. Vaccine costs, including opportunity costs, were assessed with the question, “How easy is it to pay for vaccination?” with ordinal response options of Not at all easy, A little easy, Moderately easy, and Very easy. Vaccination intent was assessed with the question, “The Philippines has a schedule of vaccines for children. Do you want your child to get none of these vaccines, some of these vaccines or all of these vaccines?” with response options All, Some, or None.

Statistical Analysis

Statistical analysis was conducted in Stata 18 [24].

Household survey data were analyzed using base weights for the probability of selecting each barangay and household based on the probability proportional to size, drawn from the population projections in the 2020 census. No base weights were incorporated in the mobile or online data, given the sampling design of these methods assumes uniformity across the areas. For mobile and online data, CIs are not reported, as they would not accurately reflect the true uncertainty due to the nonprobabilistic sampling approach; instead, we focus on point estimates while acknowledging the limitations in generalizability [25].

For the household, mobile, and online data, poststratification weights were used to match the marginal totals for strata defined by the combination of educational attainment of the household head, main water source, and region.
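
As an illustration of this weighting step, the sketch below implements simple cell-based poststratification in Python (the study's analysis was conducted in Stata); the strata and population shares are hypothetical placeholders:

```python
from collections import Counter

def poststrat_weights(sample_strata, pop_shares):
    """Cell-based poststratification: each respondent in stratum s is
    weighted by pop_shares[s] divided by the sample share of s, so the
    weighted stratum totals match the population margins."""
    n = len(sample_strata)
    counts = Counter(sample_strata)
    return [pop_shares[s] * n / counts[s] for s in sample_strata]

# Hypothetical strata: (education, water source, region) combinations.
sample = [("HS+", "piped", "V")] * 6 + [("<HS", "well", "XII")] * 4
pop = {("HS+", "piped", "V"): 0.5, ("<HS", "well", "XII"): 0.5}
w = poststrat_weights(sample, pop)
# The first stratum is overrepresented in the sample (60% vs 50%), so it
# is down-weighted: w[0] = 0.5 * 10 / 6 ≈ 0.833, w[-1] = 0.5 * 10 / 4 = 1.25
```

After weighting, each stratum's weighted share equals its population share, which is the property the marginal-total matching described above relies on.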

We reported respondent demographics, using the raw counts and weighted proportions for each demographic variable.

To assess the proportion of survey respondents whose children had received at least 1 dose of DPT by survey method, we calculated the poststratification weighted proportion of DPT vaccination status for each survey method, as well as the 95% CI for the household survey.

We assessed the distribution for 6 of the core BeSD questions listed above by estimating the weighted proportion of responses for each survey method. We did not assess the seventh BeSD question on vaccination intent, as intent was not directly linked with DPT vaccination status. We further explored the distribution of BeSD responses by DPT vaccination status by estimating the weighted proportion of BeSD responses by vaccination status for each survey method. We report the 95% CIs for the household survey data. We assumed data were missing completely at random and used a complete-records analysis.

Process Description and Survey Costing

We described the process of implementing each of the 3 survey methods, assessed programmatic costs, and conducted a data quality assessment for the mobile survey responses. We assessed the financial cost (monetary outlays) from the implementer’s perspective using a retrospective top-down costing approach. Measures were the total cost per survey method and the cost per survey response. Financial expenses incurred by the implementer for each of the 3 survey methods were obtained from the implementer’s budget. Costs included in the analysis were all financial costs for personnel time (paid labor), per diem, transportation, equipment, BHW incentives, survey participant incentives, venue rental, supplies, and advertising and platform costs for each data collection method. We included costs from these program activities: planning and data collection (survey administration), cleaning, validation, and storage. Costs were assessed in 2025 Philippine pesos (Php) and converted to US dollars using the Wise exchange rate on April 4, 2025 (US $1=Php 57.01), with no inflationary adjustments. These overall costs were divided by the number of completed survey responses to yield the financial cost per completed survey for each survey method. Values are presented in nominal US dollars.
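
The per-response cost calculation reduces to dividing total monetary outlays, converted at the stated exchange rate, by the number of completed responses. The sketch below illustrates this; the peso total and response count are hypothetical inputs, with only the exchange rate taken from the text:

```python
# Exchange rate from the text (Wise, April 4, 2025).
PHP_PER_USD = 57.01

def cost_per_response(total_cost_php: float, n_responses: int) -> float:
    """Financial cost per completed survey response, in nominal US dollars."""
    return (total_cost_php / PHP_PER_USD) / n_responses

# Hypothetical example: a survey costing Php 850,000 with 2153 responses.
usd = cost_per_response(850_000, 2153)  # ≈ US $6.93 per response
```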

The data quality assessment focused on the accuracy of responses at 2 time points. First, a preprocessing assessment automatically identified ineligible or duplicate responses immediately after submission. Responses were deemed ineligible if the response included a respondent age below 18 years, a respondent region of residence outside of Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), or a respondent-reported child age above 2 years. Second, we randomly selected 10% of mobile survey respondents for a callback to confirm eligibility. Among the mobile survey respondents who received a callback, we assessed the number flagged for quality concerns, tabulated the weighted proportion for each of the demographics and BeSD responses, and then calculated the Pearson chi-squared design-based F statistic and P values (Table S2 in Multimedia Appendix 1). These statistical analyses were possible within a single survey method, although it was not possible to conduct these analyses between the 3 survey methods, given the variations in sampling and weighting approaches. It was not possible to conduct a quality assessment for the online survey, given that we could not assess duplicate entries from the same source and contact information was not collected for follow-up verification.
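
The preprocessing rules described above can be sketched as a simple screening function; the field names and return labels are illustrative placeholders, not the actual survey schema:

```python
def screen_response(resp: dict, seen: set) -> str:
    """Apply the preprocessing rules described above: flag responses from
    under-18 respondents, residents outside Regions V and XII, or children
    over 2 years, plus perfect duplicates of an earlier submission."""
    if resp["respondent_age"] < 18:
        return "ineligible: respondent under 18"
    if resp["region"] not in {"V", "XII"}:
        return "ineligible: outside study regions"
    if resp["child_age_years"] > 2:
        return "ineligible: child over 2 years"
    key = tuple(sorted(resp.items()))   # perfect-duplicate check
    if key in seen:
        return "duplicate"
    seen.add(key)
    return "retained"

seen = set()
resp = {"respondent_age": 25, "region": "V", "child_age_years": 1}
print(screen_response(resp, seen))   # retained
print(screen_response(resp, seen))   # duplicate (identical resubmission)
```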

Ethical Considerations

This study was assessed and approved by the Philippines Department of Health Single Joint Ethics Review Board (SJREB-2024‐55) on November 18, 2024. This activity was reviewed by the Centers for Disease Control and Prevention (CDC)’s Human Subjects Office, deemed not research, and was conducted consistent with applicable federal law and CDC policy (see, eg, §45 CFR part 46, 21 CFR part 56; 42 USC §241(d); 5 USC §552a; 44 USC §3501 et seq). For the household survey, all participants received Php 100 (~US $1.80) to thank them for their time. For the mobile survey, participants received Php 50 (~US $0.90) upon survey completion. Participants did not receive any remuneration for their participation in the online survey.

Results

Respondent Characteristics

Household, mobile, and online survey methods were implemented from January to March 2025. Among the 1605 households visited for the household survey, 1208 respondents were available, of whom 1206 consented, and 1201 respondents completed the household survey (Figure S1 in Multimedia Appendix 2). Among the estimated 4451 eligible households in the participating BHWs’ catchment areas, 2987 consented to participate, of whom 2500 completed the mobile survey. Following quality checks of the mobile survey data, 272 responses were excluded, including 252 perfect duplicates within the same municipality, 18 ineligible upon backchecking, and 2 with incomplete information. Among the 2153 mobile responses retained in the analytic dataset, more than 99% (2135/2153) had been completed via SMS text messaging rather than IVR (18/2153). Among the online respondents, 26,464 initiated the survey, 996 completed it, but only 396 of them reported that they resided in the 2 eligible regions. The final analytic dataset included 1201 household survey responses, 2153 mobile responses, and 398 online responses, for a total of 3752 responses (Table 2). Excluded respondent demographics differed significantly from those in the analytic dataset (Table S2 in Multimedia Appendix 1).

Table 2. Demographics of respondents by survey method, by number and weighted proportion: Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), January–March 2025.
Variable | Household, n | Household, weighted % (95% CI)a | Mobile, n | Mobile, weighted %b | Online, n | Online, weighted %b
Region
Region V (Bicol) | 697 | 53.2 | 749 | 53.2 | 296 | 58.4
Region XII (SOCCSKSARGEN) | 504 | 46.8 | 1404 | 46.8 | 102 | 41.6
Sex
Male | 55 | 4.5 (3.3-6.1) | 433 | 17.7 | 20 | 15.1
Female | 1146 | 95.5 (93.9-96.7) | 1712 | 82.3 | 378 | 84.9
Missingc | — | — | 8 | — | — | —
Age (y)
18-25 | 326 | 29.3 (26.4-32.3) | 1078 | 45.4 | 35 | 13.7
26-40 | 724 | 58.3 (54.9-61.6) | 940 | 48.7 | 342 | 78.6
41-60 | 130 | 10.6 (9-12.5) | 110 | 5.6 | 18 | 3
>60 | 21 | 1.8 (1.2-2.9) | 7 | 0.3 | 3 | 4.7
Missingc | — | — | 18 | — | — | —
Education
Did not complete high school | 321 | 26.8 | 436 | 37.1 | 17 | 24.5
High school graduate | 877 | 73.2 | 1687 | 62.9 | 381 | 75.5
Missingc | 3 | — | 30 | — | — | —
Main drinking water source
Piped water | 455 | 39 | 753 | 39 | 56 | 39.8
Tube well or borehole | 133 | 20.1 | 326 | 20.1 | 11 | 18.2
Dug well | 49 | 7.6 | 167 | 7.6 | 5 | 3.9
Water from a spring | 83 | 8.9 | 184 | 8.9 | 26 | 8.7
Water refilling station | 458 | 21.8 | 672 | 21.8 | 276 | 26.2
Others | 23 | 2.6 | 51 | 2.6 | 24 | 3.2

aHousehold survey data were analyzed using base weights for the probability of selecting each barangay and household based on the probability proportional to size, drawn from the population projections in the 2020 census, along with poststratification weights to match the marginal totals of each stratum of age and sex to the regional distribution of educational attainment and main water source for each region.

bPoststratification weights were used to match the marginal totals of each stratum of age and sex to the regional distribution of educational attainment and main water source for each region. We do not report CIs for the mobile and online surveys, given that CI estimates would substantially underestimate the true variability due to their sampling approaches.

cEntries with no inputs (zero responses) and entries where weighted proportions cannot be calculated due to zero responses are indicated as “—”.

The unweighted count and weighted proportion of respondents by demographic characteristic varied substantially by survey method. The household survey had the smallest proportion of male respondents at 4.5% (n=55; 95% CI 3.3%-6.1%), followed by the online survey at 15.1% (n=20), and the mobile survey at 17.7% (n=433). Age also varied by survey method, with 78.6% (n=342) of online respondents reporting an age between 26 and 40 years, compared with 58.3% (n=724; 95% CI 54.9%-61.6%) in the household survey and 48.7% (n=940) of respondents in the mobile survey. Online survey respondents were the most likely to be high school graduates at 75.5% (n=381), followed by household respondents at 73.2% (n=877), and mobile respondents at 62.9% (n=1687).

The probability of reaching a household with a child vaccinated against DPT was consistent between household and mobile surveys at 91.8% (n=1090; 95% CI 90%-93.3%) and 90.3%, respectively, but substantially lower at 85% among online respondents (Figure 1). Among the 93.1% of household survey respondents who gave permission for vaccination card validation, 92.6% (n=1017; 95% CI 91%-94%) of children received at least 1 dose of DPT vaccine. Among household respondents without card verification, caregiver-reported DPT vaccination coverage was somewhat lower (85.2%, n=73; 95% CI 76.7%-91%) than among those with card-verified data.

At least 85% of respondents in each survey method reported the highest level of vaccine demand, indicating that vaccines are very important, very safe, supported by family, and that they knew where to access vaccination (Figure 2). In response to the BeSD question on vaccine importance, 94.5% (n=1139; 95% CI 92.4%‐96%) of household survey respondents, 89.3% (n=2258) of mobile respondents, and 92.2% (n=381) of online respondents indicated that vaccines are very important. In response to the BeSD question on vaccine safety, 88.5% (n=1056; 95% CI 86%‐90.6%) of household survey respondents, 89.7% (n=2240) of mobile respondents, and 92.8% (n=386) of online respondents indicated that vaccines are very safe. In response to the BeSD question on family support for vaccination, 97.3% (n=1166; 95% CI 96.1%‐98.2%) of household survey respondents, 97.8% (n=2449) of mobile respondents, and 94.6% (n=390) of online respondents indicated that their families supported vaccination. In response to the BeSD question on knowing where to access vaccination, 99.8% (n=1198; 95% CI 99.2%‐99.9%) of household survey respondents, 96.3% (n=2362) of mobile respondents, and 95.1% (n=393) of online respondents indicated that they knew where to access vaccination.

Figure 1. Diphtheria-pertussis-tetanus vaccination (at least 1 dose) by survey method: Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), January–March 2025. DPT: diphtheria-pertussis-tetanus.

Reported ease of accessing vaccination varied by survey method, with the highest proportion of mobile survey respondents indicating that it was not at all easy to access vaccination (Figure 2). Among household survey respondents, 75.6% (n=899; 95% CI 72.1%‐78.9%) indicated it was very easy to access vaccination, compared with 53.9% (n=1205) of mobile respondents and 72% (n=294) of online respondents. Similarly, reported ease of paying for vaccination varied by survey method, with more than 30% of mobile and online survey respondents indicating it was not at all easy to pay for vaccination, compared with 9.4% (n=112; 95% CI 7.9%‐11.2%) of household respondents. Conversely, 62.1% (n=754; 95% CI 58.3%‐65.7%) of household survey respondents indicated that it was very easy to pay for vaccination, compared with 32.1% (n=744) of mobile respondents and 41.4% (n=169) of online respondents.

The responses to BeSD questions on vaccine importance, vaccine safety, and family support for vaccination did not vary substantially by survey method when the distribution of responses was categorized by DPT vaccination status (Figure S2 in Multimedia Appendix 3).

Figure 2. Behavioral and social drivers of vaccination by survey method: Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), January–March 2025.

Process Description

Among the 2153 responses retained in the analytic dataset for the mobile survey, 348 mobile survey respondents received a quality callback, of whom 117 (33.6%) indicated that the initial survey response was ineligible or not concordant with the original responses to demographic survey questions. The demographic distribution of the respondents flagged for quality concerns differed significantly from that of those not flagged for quality concerns (Table S2 in Multimedia Appendix 1). However, the BeSD responses for vaccine importance, safety, family support for vaccination, and ease of paying for vaccination did not vary significantly between responses flagged for quality concerns and those that were not flagged (Table S2 in Multimedia Appendix 1).

The financial cost per survey from the implementer’s perspective, in nominal US $, was US $2.10 per survey response for the online survey, US $6.93 for the mobile survey, and US $29.38 for the household survey (Table S3 in Multimedia Appendix 1). Paid personnel time and per diem were the main cost drivers for the household survey, respondent incentives were the main cost driver for the mobile survey, and advertising and platform costs were the main cost drivers for the online survey.
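As a worked illustration of the costing approach described in the Methods (summing the monetary outlays for a survey method and dividing by the number of responses received), the sketch below reproduces the arithmetic with entirely hypothetical line items; the function name, cost categories, and amounts are assumptions for illustration, not figures from the study’s budget.

```python
# Minimal sketch of the implementer-perspective cost-per-response calculation:
# total financial outlays for a survey method divided by responses received.
# All line items below are HYPOTHETICAL illustrations, not study figures.

def cost_per_response(line_items: dict[str, float], n_responses: int) -> float:
    """Sum monetary outlays across line items and divide by responses received."""
    total = sum(line_items.values())
    return total / n_responses

# Hypothetical mobile survey line items (assumed values); per the text,
# respondent incentives were the main cost driver for the mobile method.
mobile_costs = {
    "respondent_incentives": 10_000.0,
    "platform_fees": 2_500.0,
    "airtime": 1_360.0,
}

print(round(cost_per_response(mobile_costs, 2000), 2))  # prints 6.93
```

With these assumed inputs, the hypothetical totals happen to reproduce the reported US $6.93 per mobile response, showing only how the quotient is formed, not how the actual budget decomposed.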


Principal Findings

Our study comparing 3 different methods of collecting BeSD of vaccination data in 2 regions of the Philippines found that respondent characteristics varied by method, with mobile survey respondent demographics and child vaccination status closer to household survey responses than to online survey responses. Of particular interest, the proportion of caregivers reporting a child unvaccinated against DPT was 9.7% for the household survey, 8.2% for the mobile survey, and 15% for the online survey, indicating that all 3 survey methods were able to reach unvaccinated populations, with online respondents reporting the highest proportion of unvaccinated children.

Overall, respondents for each survey method indicated high vaccine demand, with more than 85% of respondents in each survey method indicating that vaccines are very important and very safe, that vaccination was supported by family, and that they knew where to access vaccination. Within these overall trends, the mobile survey responses were more closely reflective of the household responses than the online survey responses. However, responses to questions on ease of accessing and paying for vaccination showed more heterogeneity within and between survey methods, with household survey respondents more likely to indicate that it was very easy to access and pay for a vaccine than mobile and online survey respondents. These differential responses to the BeSD questions on challenges could be due to underlying differences in population characteristics, or they could be attributed to differential effects of social desirability bias between the interviewer-administered household survey and the self-administered mobile and online surveys.

The cost per survey response increased roughly threefold from the online to the mobile survey and fourfold from the mobile to the household survey. A quality check reassessed 348 of the 2153 retained mobile survey responses through callbacks and identified 117 instances of ineligible or inconsistent responses; this quality assessment did not identify substantive differences in magnitude or significance for 4 of the core BeSD questions between responses flagged in these checks and those that were not flagged.

Each survey method offered benefits and drawbacks. The household survey generated responses that could be considered representative of their respective regions, though the costs per response were considerably higher than the other 2 methods. The mobile survey could be conducted at a fraction of the cost of the household survey, and it yielded similar responses to the BeSD questions. However, the demographic distribution of the respondents differed substantially from that of the household survey, and the quality assessment identified nearly one-third of the responses as ineligible, based on region or caregiver status. The online survey was the least expensive to implement and also yielded similar responses to the BeSD questions as the other 2 methods. However, online survey respondent demographics were substantially different from those of household survey respondents, and there was no means to ascertain respondent eligibility by regional residence or caregiver status.

The WHO has recommended that all countries incorporate questions from validated BeSD surveys [26] in routine data collection and recommends that countries with low or inequitable coverage of childhood immunization conduct national or subnational surveys every 2 to 3 years [27]. These indicators have been deployed in contexts ranging from human papillomavirus vaccination in Zimbabwe [28] to COVID-19 vaccination in the Philippines [29]. However, the routine implementation of these validated survey questions poses challenges in resource-constrained settings, particularly if rapid data collection is needed following an event that causes a rapid drop in vaccine confidence.

Our work adds to the growing literature that compares mobile and online data collection methods to household surveys in LMICs [30]. A Cochrane review identified app-based data collection methods as noninferior to other methods of data collection [31], but there were no studies included from LMICs. In a systematic review comparing survey methods in LMICs, only 3 of the included studies compared remotely delivered surveys to household surveys, of which 2 focused on infant feeding programs [31,32] and a third examined household poverty levels following economic shocks. To our knowledge, this study is the first comparison of survey methods regarding BeSD of vaccination at the community level in an LMIC.

Compared to mobile and online methods, household surveys are typically the most representative of population demographics, with mobile and online data drawing a relatively lower proportion of older respondents, rural respondents, and those with less than a high school education and less relative wealth [10,33,34]. Our study identified similar trends in representativeness by survey method [35], although the key variable of interest, DPT vaccination status, differed between data collection methods, indicating that mobile and online surveys are less reliable than household surveys in ascertaining vaccination status.

The literature suggests that the reliability of responses by survey method depends on the type of question asked [36], though mobile surveys have been demonstrated to be noninferior to in-person data collection in survey responsiveness for questions deemed less sensitive [37]. An analysis comparing sequential calling of IVR and computer-assisted telephone interviewing in Bangladesh and Tanzania found high consistency in responses to questions on alcohol consumption type and smoking, moderate consistency for history of diabetes or hypertension, and low consistency for routine activities, including physical activity and diet [38]. In a study comparing farmers in India who were surveyed both in person and by phone, significant differences in production were detected due to differential answers by survey method, rather than differential attrition [33]. Our study identified differences in demographics by survey method but similarities across methods in perceptions of vaccine importance, safety, family support, and knowledge of where to access vaccination.

We found that the financial cost per survey method differed substantially, with the online survey the least expensive and the household survey the most. These findings are consistent with other studies that evaluate costs per survey method between SMS text messaging, IVR, and computer-assisted telephone interviewing [39] and those that examine costs of including mobile phones in existing survey modalities [40]. Our direct comparison by survey method allows decision-makers to assess the preferred survey method in the context of representativeness, responses to core BeSD questions, and financial costs from the implementer’s perspective.

To our knowledge, our study represents the first direct comparison of different survey methods for BeSD data. Our findings illustrate that while survey respondents differed substantially by data collection method, responses to the BeSD questions were markedly similar across all 3 methods. These findings point to the potential of mobile and online methods to complement household surveys, particularly in contexts where mobile ownership or internet connectivity is high and there is reported homogeneity in vaccine drivers across subpopulations.

Limitations

This study has at least 5 limitations. First, the survey methods did not use the same sampling frame; thus, it was not possible to assess how the underlying population demographics varied between the 3 sampling frames. The differences in sampling frames limit our ability to assess the extent to which differences between surveys were attributable to the underlying population differences in sampling frames vs the differences in respondents by survey type. We aimed to address this limitation by sampling adjacent barangays for the household and mobile survey methods, and we adjusted for key demographic and socioeconomic factors. In addition, we do not draw conclusions on comparative representativeness by survey type due to this limitation. Second, our data do not permit an assessment of response rates for each data collection method, as the denominator for the mobile surveys and online data was not available. We report only the number of responses and compare the representativeness of respondents between the survey methods. Third, with the exception of vaccination card verification in the household survey method, the responses to the survey questions could not be independently verified and could have been subject to social desirability bias or recall limitations. To help mitigate this, in-person enumerators assured caregivers that responses were anonymous and, when possible, referred to the child’s vaccination card for verification. While the BeSD questions cannot be independently verified, we sought to mitigate any unmeasured social desirability bias through the remote administration methods and by training household survey data collectors in appropriate data collection techniques that mitigate the risk of these biases.
In addition, the mobile survey method included a data quality assessment of response accuracy to quantify the proportion of respondents whose responses to demographic questions indicated that they were not eligible by age, region of residence, or status as caregivers of children 2 years of age or younger. Fourth, because the mobile survey was distributed by BHWs, its sampling frame was not structured to reach households that might be missed by household survey methods and could therefore be subject to similar biases in missing transitory or hard-to-reach populations. Furthermore, distribution via BHWs limits the generalizability of these findings to contexts in which trusted community health workers maintain listings of vaccine-eligible households in their catchment areas. While we acknowledge this limitation, we also note that the mobile method yielded a similar proportion of children unvaccinated against DPT. Finally, we note that the financial costing of survey administration from the implementer’s perspective does not include economic costs, such as the in-kind labor cost of workers whose salaries are covered by the Department of Health, or the systems cost of integrating BeSD data collection into Department of Health data platforms. Furthermore, the financial costs presented here are those reported by the implementer in its budgets and may differ from the actual expenses incurred. As with the survey response rates, the costs of each survey method are not directly comparable because the methods differ in representativeness and length.

Conclusions

Although respondent demographics and child vaccination status differed between household, mobile, and online data collection methods, respondents consistently indicated high vaccine demand as well as challenges in accessing and paying for vaccination. Selecting the most suitable survey method requires weighing trade-offs between methods, including cost, access to technology, and the target demographics of caregivers or children. In areas with high mobile phone ownership and strong mobile and internet connectivity, routine mobile and online vaccine demand data collection can supplement periodic household surveys.

Acknowledgments

The authors wish to thank Dr Janis Asuncion Bunoan-Macazo, Dr Tony Mounts, Dr Joseph Breese, Dr Nelly Mejia Gonzalez, the data collection team, Barangay Health Workers, and participants involved in this study. The coauthors attest that we did not use generative AI tools for any part of the study, including study design, study question development, data analysis, figure generation, or manuscript drafting.

Funding

This study was funded by the US Centers for Disease Control and Prevention.

Disclaimer

The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the US Centers for Disease Control and Prevention.

Data Availability

Data can be made available upon request.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Survey questions and response options for each survey method, mobile survey quality assessment, and total financial cost and financial cost per survey response.

DOCX File, 55 KB

Multimedia Appendix 2

Flow diagram of household, mobile, and online survey participants: Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), January–March 2025.

PNG File, 84 KB

Multimedia Appendix 3

Behavioral and social drivers of vaccination by DPT vaccination status and survey method: Philippines Regions V (Bicol) and XII (SOCCSKSARGEN), January–March 2025.

PNG File, 283 KB

  1. Hogan DR, Stevens GA, Hosseinpoor AR, Boerma T. Monitoring universal health coverage within the sustainable development goals: development and baseline data for an index of essential health services. Lancet Glob Health. Feb 2018;6(2):e152-e168. [CrossRef] [Medline]
  2. Brewer NT, Chapman GB, Rothman AJ, Leask J, Kempe A. Increasing vaccination: putting psychological science into action. Psychol Sci Public Interest. Dec 2017;18(3):149-207. [CrossRef] [Medline]
  3. Rego RT, Zhukov Y, Reneau KA, et al. Promoting data harmonization to evaluate vaccine hesitancy in LMICs: approach and applications. BMC Med Res Methodol. Nov 24, 2023;23(1):278. [CrossRef] [Medline]
  4. Cutts FT, Claquin P, Danovaro-Holliday MC, Rhoda DA. Monitoring vaccination coverage: defining the role of surveys. Vaccine. Jul 29, 2016;34(35):4103-4109. [CrossRef] [Medline]
  5. Thomson DR, Bhattarai R, Khanal S, et al. Addressing unintentional exclusion of vulnerable and mobile households in traditional surveys in Kathmandu, Dhaka, and Hanoi: a mixed-methods feasibility study. J Urban Health. Feb 2021;98(1):111-129. [CrossRef] [Medline]
  6. Meyer BD, Mok WKC, Sullivan JX. Household surveys in crisis. J Econ Perspect. Nov 1, 2015;29(4):199-226. [CrossRef]
  7. Gibson DG, Pereira A, Farrenkopf BA, Labrique AB, Pariyo GW, Hyder AA. Mobile phone surveys for collecting population-level estimates in low- and middle-income countries: a literature review. J Med Internet Res. May 5, 2017;19(5):e139. [CrossRef] [Medline]
  8. Collins E, Warren S, Lamke C, Contreras I, Henderson S, Rosenbaum M. Representativeness of remote survey methods in LMICs: a cross-national analysis of pandemic-era studies. SSRN Journal. May 31, 2023. [CrossRef]
  9. Lambrecht I, van Asselt J, Headey D, et al. Can phone surveys be representative in low- and middle-income countries? An application to Myanmar. PLoS One. 2023;18(12):e0296292. [CrossRef] [Medline]
  10. Greenleaf AR, Gibson DG, Khattar C, Labrique AB, Pariyo GW. Building the evidence base for remote data collection in low- and middle-income countries: comparing reliability and accuracy across survey modalities. J Med Internet Res. May 5, 2017;19(5):e140. [CrossRef] [Medline]
  11. Gibson DG, Tamrat T, Mehl G. The state of digital interventions for demand generation in low- and middle-income countries: considerations, emerging approaches, and research gaps. Glob Health Sci Pract. Oct 10, 2018;6(Suppl 1):S49-S60. [CrossRef] [Medline]
  12. Behavioural and social drivers of vaccination: tools and practical guidance for achieving high uptake. World Health Organization; 2022. URL: https://iris.who.int/server/api/core/bitstreams/c9f4a77a-001e-477d-a9d1-b05832820b64/content [Accessed 2025-03-14]
  13. Jäckle A, Roberts C, Lynn P. Assessing the effect of data collection mode on measurement. Int Statistical Rev. Apr 2010;78(1):3-20. [CrossRef]
  14. Burki T. Zero-dose immunization programme reaches milestone. Lancet Infect Dis. Apr 2024;24(4):e228-e229. [CrossRef] [Medline]
  15. Reducing vaccine hesitancy in the Philippines: findings from a survey experiment. World Bank; 2021. URL: https:/​/thedocs.​worldbank.org/​en/​doc/​9b206c064482a4fbb880ee23d6081d52-0070062021/​original/​Vaccine-Hesitancy-World-Bank-Policy-Note-September-2021.​pdf [Accessed 2026-03-14]
  16. Kemp S. Digital 2025: the Philippines. DataReportal. URL: https://datareportal.com/reports/digital-2025-philippines [Accessed 2025-03-14]
  17. 2022 Philippine National Demographic and Health Survey (NDHS): final report. Philippine Statistics Authority (PSA) and ICF; 2023. URL: https://www.dhsprogram.com/pubs/pdf/FR381/FR381.pdf [Accessed 2025-03-14]
  18. Digital development dashboard Philippines. International Telecommunication Union (ITU); 2024. URL: https://www.itu.int/en/ITU-D/Statistics/Documents/DDD/ddd_PHL.pdf [Accessed 2025-03-14]
  19. Phillips DR. Primary health care in the Philippines: banking on the Barangays? Soc Sci Med. 1986;23(10):1105-1117. [CrossRef] [Medline]
  20. Bernadas KA. PBBM urges public to support ‘catch-up’ immunization among Bicol kids. Philippine Information Agency. 2024. URL: https://pia.gov.ph/news/pbbm-urges-public-to-support-catch-up-immunization-among-bicol-kids/ [Accessed 2025-05-14]
  21. Republic act 10173 data privacy act of 2012, Republic of the Philippines Congress of the Philippines Metro Manila Fifteenth Congress Second Regular Session (2013). National Privacy Commission. URL: https://privacy.gov.ph/data-privacy-act/ [Accessed 2025-05-14]
  22. Distor C, Moon MJ. Artificial intelligence and big data in COVID-19 response: lessons from the KIRA chatbot and data management platform of the Department of Health, Philippines. Yonsei University; 2022. URL: https:/​/www.​researchgate.net/​publication/​362620886_Artificial_Intelligence_and_Big_Data_in_COVID-19_Response_Lessons_from_the_KIRA_Chatbot_and_Data_Management_Platform_of_the_Department_of_Health_-_Philippines [Accessed 2026-03-14] [CrossRef]
  23. Wonodi C, Farrenkopf BA. Defining the zero dose child: a comparative analysis of two approaches and their impact on assessing the zero dose burden and vulnerability profiles across 82 low- and middle-income countries. Vaccines (Basel). Sep 28, 2023;11(10):1543. [CrossRef] [Medline]
  24. Stata statistical software: release 16. StataCorp LLC. 2019. URL: https://www.stata.com/stata16/ [Accessed 2022-08-16]
  25. Baker R, Brick JM, Bates NA, et al. Summary report of the AAPOR Task Force on Non-probability Sampling. J Surv Stat Methodol. Nov 1, 2013;1(2):90-143. [CrossRef]
  26. Wiley KE, Levy D, Shapiro GK, et al. A user-centered approach to developing a new tool measuring the behavioural and social drivers of vaccination. Vaccine (Auckl). Oct 8, 2021;39(42):6283-6290. [CrossRef] [Medline]
  27. World Health Organization. Understanding the behavioural and social drivers of vaccine uptake WHO position paper – May 2022. Wkly Epidemiol Rec. 2022;97:209-224. URL: https://iris.who.int/server/api/core/bitstreams/79007951-d442-42fe-a9f6-23455cc457d5/content [Accessed 2025-03-14]
  28. Chuma DM, Machacha R, Adjagba AO, January J. Behavioural and social drivers (BeSD) of HPV vaccination in Zimbabwe: a rapid scoping review of literature. Asian Pac J Cancer Prev. Mar 1, 2025;26(3):775-783. [CrossRef] [Medline]
  29. Silvestre CJ, Sornillo BJT, Endoma V, et al. Newness, unfamiliarity, and cultural beliefs; social and behavioural barriers to COVID-19 vaccination among the Dumagat Remontado, an Indigenous population in the Philippines. Health Place. May 2025;93:103444. [CrossRef] [Medline]
  30. Du X, Wang W, Helena van Velthoven M, et al. mHealth Series: text messaging data collection of infant and young child feeding practice in rural China - a feasibility study. J Glob Health. Dec 2013;3(2):020403. [CrossRef] [Medline]
  31. Marcano Belisario JS, Jamsek J, Huckvale K, O’Donoghue J, Morrison CP, Car J. Comparison of self-administered survey questionnaire responses collected using mobile apps versus other methods. Cochrane Database Syst Rev. Jul 27, 2015;2015(7):MR000042. [CrossRef] [Medline]
  32. Li Y, Wang W, van Velthoven MH, et al. Text messaging data collection for monitoring an infant feeding intervention program in rural China: feasibility study. J Med Internet Res. Dec 4, 2013;15(12):e269. [CrossRef] [Medline]
  33. Anderson E, Lybbert TJ, Shenoy A, Singh R, Stein D. Does survey mode matter? Comparing in-person and phone agricultural surveys in India. J Dev Econ. Jan 2024;166:103199. [CrossRef] [Medline]
  34. Mahfoud Z, Ghandour L, Ghandour B, Mokdad AH, Sibai AM. Cell phone and face-to-face interview responses in population-based surveys. Field methods. Feb 2015;27(1):39-54. [CrossRef]
  35. Rane MS, Kochhar S, Poehlein E, et al. Determinants and trends of COVID-19 vaccine hesitancy and vaccine uptake in a national cohort of US adults: a longitudinal study. Am J Epidemiol. Mar 24, 2022;191(4):570-583. [CrossRef] [Medline]
  36. Ballivian A, Azevedo JP, Durbin W. Using mobile phones for high-frequency data collection. In: Toninelli D, Pinter R, de Pedraz P, editors. Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies. Ubiquity Press; 2015:21-39. [CrossRef]
  37. Chasukwa M, Choko AT, Muthema F, et al. Collecting mortality data via mobile phone surveys: a non-inferiority randomized trial in Malawi. PLoS Glob Public Health. 2022;2(8):e0000852. [CrossRef] [Medline]
  38. Pariyo GW, Greenleaf AR, Gibson DG, et al. Does mobile phone survey method matter? Reliability of computer-assisted telephone interviews and interactive voice response non-communicable diseases risk factor surveys in low and middle income countries. PLoS One. 2019;14(4):e0214450. [CrossRef] [Medline]
  39. Lau CQ, Cronberg A, Marks L, Amaya A. In search of the optimal mode for mobile phone surveys in developing countries. a comparison of IVR, SMS, and CATI in Nigeria. Surv Res Methods. 2019;13(3):305-318. [CrossRef]
  40. Vehovar V, Berzelak N, Lozar Manfreda K. Mobile phones in an environment of competing survey modes: applying metric for evaluation of costs and errors. Soc Sci Comput Rev. Aug 2010;28(3):303-318. [CrossRef]


BeSD: behavioral and social drivers
BHW: Barangay Health Worker
CDC: Centers for Disease Control and Prevention
DOH: Department of Health
DPT: diphtheria-pertussis-tetanus
IVR: interactive voice response
LMIC: low- and middle-income country
SMS: short message service
WHO: World Health Organization


Edited by Amaryllis Mavragani; submitted 21.Jul.2025; peer-reviewed by Abigail Greenleaf, Sohail Agha; accepted 31.Dec.2025; published 10.Apr.2026.

Copyright

© Kimberly E Bonner, Mikka Hipol, Dominique Sy, Rivandra Royono, Douglas Johnson, Isabel del Rosario, Alice Redfern, Darahlyn Biel-Romualdo, Man Kai Wong, Eugene Lam, Shibani Kulkarni, Kirsten Ward, Romel Lacson, Meng-Yu Chen, James Matthew Miraflor, Devon Ray Pacial, Rowena Bunoan, Hugo Catan, Talya Shragai. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 10.Apr.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.